Building a successful SEO (Search Engine Optimization) campaign requires a great deal of time and hard work. Search engines constantly change their algorithms, and it's up to you to make the adjustments needed to keep up. Keeping track of all of your optimized pages can be a daunting task, but you can avoid unnecessary confusion by organizing those pages in a streamlined fashion. Although it isn't common practice, this is one of the most important steps in any successful SEO campaign.
What do I mean by "organized"? Simply that you should develop a clear plan for how your pages will be named and where they will be situated on your web site. You need to be able to easily identify and track which pages have been indexed by which engine and which pages need to be updated. One way to achieve this is to adopt a "naming convention".
Example 1:
Your company web site sells widgets. You have a list of 5 of your most important keywords, and you've optimized these keywords for 4 search engines. That's a total of 20 optimized pages. You have a robots.txt file set up to prevent search engine 'A' from indexing pages that are intended for search engine 'B', and so on (a sketch of such a file appears just after the table).
Keyword | Page Name | Engine |
widgets | widgets.htm | Google |
blue widgets | bluewidgets.htm | Google |
red widgets | redwidgets.htm | Google |
black widgets | blackwidgets.htm | Google |
purple widgets | purplewidgets.htm | Google |
widgets | widgets2.htm | MSN |
blue widgets | bluewidgets2.htm | MSN |
red widgets | redwidgets2.htm | MSN |
black widgets | blackwidgets2.htm | MSN |
purple widgets | purplewidgets2.htm | MSN |
widgets | widgets3.htm | AltaVista |
blue widgets | bluewidgets3.htm | AltaVista |
red widgets | redwidgets3.htm | AltaVista |
black widgets | blackwidgets3.htm | AltaVista |
purple widgets | purplewidgets3.htm | AltaVista |
widgets | widgets4.htm | Hotbot |
blue widgets | bluewidgets4.htm | Hotbot |
red widgets | redwidgets4.htm | Hotbot |
black widgets | blackwidgets4.htm | Hotbot |
purple widgets | purplewidgets4.htm | Hotbot |
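For illustration, here is a minimal sketch of what the robots.txt file described in Example 1 might look like. The crawler names shown (Googlebot for Google, msnbot for MSN, Scooter for AltaVista, Slurp for HotBot/Inktomi) are assumptions for the sake of the example, so check each engine's documentation for the exact user-agent string it honors. Only the "widgets" pages are shown; the same pattern would repeat for the other keyword pages.

# Keep Google's crawler away from pages intended for the other engines
User-agent: Googlebot
Disallow: /widgets2.htm
Disallow: /widgets3.htm
Disallow: /widgets4.htm

# Keep MSN's crawler away from pages intended for the other engines
User-agent: msnbot
Disallow: /widgets.htm
Disallow: /widgets3.htm
Disallow: /widgets4.htm

# ...and so on for Scooter (AltaVista) and Slurp (HotBot).

Keep in mind, though, that later in this article I'll suggest not using robots.txt this way at all, for reasons covered below.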
Let's examine the drawbacks to this naming convention:
- The keywords are run together in the file names (bluewidgets.htm), so the engines can't recognize the individual keywords and you get no keyword credit from the file name itself.
- The numeric suffix (widgets2.htm, widgets3.htm, and so on) tells you nothing about which engine a page was optimized for, so you're left relying on memory or a separate list to keep track.
Now, let's take a look at how we can modify our page names to get credit for the keywords and to make it easy to identify which page belongs to which search engine, all while gaining maximum exposure.
Example 2:
Below, you'll see an example of how I have added hyphens to separate the keywords in the page name. I've also appended an engine indicator to the file name, so it is easy to distinguish which page is optimized for which engine.
Keyword | Page Name | Engine |
widgets | widgets.htm | Google |
blue widgets | blue-widgets-gg.htm | Google |
red widgets | red-widgets-gg.htm | Google |
black widgets | black-widgets-gg.htm | Google |
purple widgets | purple-widgets-gg.htm | Google |
widgets | widgets-ms.htm | MSN |
blue widgets | blue-widgets-ms.htm | MSN |
red widgets | red-widgets-ms.htm | MSN |
black widgets | black-widgets-ms.htm | MSN |
purple widgets | purple-widgets-ms.htm | MSN |
widgets | widgets-av.htm | AltaVista |
blue widgets | blue-widgets-av.htm | AltaVista |
red widgets | red-widgets-av.htm | AltaVista |
black widgets | black-widgets-av.htm | AltaVista |
purple widgets | purple-widgets-av.htm | AltaVista |
widgets | widgets-hb.htm | Hotbot |
blue widgets | blue-widgets-hb.htm | Hotbot |
red widgets | red-widgets-hb.htm | Hotbot |
black widgets | black-widgets-hb.htm | Hotbot |
purple widgets | purple-widgets-hb.htm | Hotbot |
I use abbreviations such as "gg" for Google, "ms" for MSN, and so on. You don't have to use my abbreviations; just make sure the naming convention you implement is consistent. That's the most important thing.
Tip: Please be careful when creating an "engine indicator." Do not spell out the entire engine name in your filename. For instance, avoid naming your page like this:
blue-widgets-google.htm
Although it has not been proven, Google and other crawlers could potentially flag such a page as a doorway page, because they may assume you created it specifically to rank highly on that engine.
You might be thinking, "I've created a robots.txt file, so I don't have to worry about search engine 'A' indexing pages that are intended for search engine 'B'." Yes, that is correct. However, if you use a robots.txt file for this purpose, you could be cheating yourself out of maximum exposure across all of the search engines.
If you do not use a robots.txt file this way, you will notice that search engine 'A' will index pages optimized for search engine 'B'. This is exactly what you want. You do have to be careful, though, because you do not want pages with content so similar that they could be flagged as spam.
It is entirely possible to optimize several different pages that target the same keyword and make each one unique enough that you will not be flagged for spam. As I mentioned, this maximizes your exposure across all of the search engines while increasing the overall unique content of your site.
I can't tell you how many times engine 'A' has picked up pages that I've optimized for engine 'B' and ranked the 'B' pages higher than those I specifically optimized for 'A.' So, if at all possible, only use a robots.txt file to protect your confidential content from being indexed.
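If you do keep a robots.txt file for that purpose, it only needs to shut crawlers out of the confidential material and nothing else. Here is a minimal sketch; the /private/ directory name is just a placeholder for wherever your confidential files actually live.

# Block all crawlers from the confidential area only
User-agent: *
Disallow: /private/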
One final tip: try to avoid creating subdirectories solely for the purpose of storing optimized pages for a specific search engine. Keeping all of your optimized pages in your root directory gives you a better chance at higher rankings, because most crawlers give more weight to pages found in the root directory. In this case, it is better to sacrifice a little organization and shoot for the higher rankings.
This article is copyrighted and has been reprinted with permission from Matt Paolini. Matt Paolini is a Webmaster/Tech Support Specialist for FirstPlace Software, the makers of WebPosition Gold. He's also an experienced freelance Search Engine Optimization Specialist and Cold Fusion/ASP.NET/SQL Server Developer/Designer. For more information on his SEO services, please visit http://www.webtemplatestore.net/seo.aspx or send him an email at webmaster@webtemplatestore.net